Feedforward Approximations to Dynamic Recurrent Network Architectures

Author

  • Dylan R. Muir
Abstract

Recurrent neural network architectures can have useful computational properties, with complex temporal dynamics and input-sensitive attractor states. However, evaluation of recurrent dynamic architectures requires solving systems of differential equations, and the number of evaluations required to determine their response to a given input can vary with the input or can be indeterminate altogether in the case of oscillations or instability. In feedforward networks, by contrast, only a single pass through the network is needed to determine the response to a given input. Modern machine learning systems are designed to operate efficiently on feedforward architectures. We hypothesized that two-layer feedforward architectures with simple, deterministic dynamics could approximate the responses of single-layer recurrent network architectures. By identifying the fixed-point responses of a given recurrent network, we trained two-layer networks to directly approximate the fixed-point response to a given input. These feedforward networks then embodied useful computations, including competitive interactions, information transformations, and noise rejection. Our approach was able to find useful approximations to recurrent networks, which can then be evaluated in linear and deterministic time complexity.
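As a concrete illustration of the approach described in the abstract, the sketch below simulates a rate-based recurrent network to its fixed point and then trains a two-layer feedforward network to map inputs directly to those fixed points. This is a minimal sketch, not the paper's code: the dynamics tau dx/dt = -x + W phi(x) + b, the ReLU transfer function phi, and all network sizes, scalings, and training hyperparameters are illustrative assumptions, since the abstract does not specify them.

# Minimal sketch of the fixed-point approximation idea (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)
N = 20            # recurrent units (assumed size)
H = 64            # hidden units in the feedforward approximation (assumed size)
TAU, DT = 10.0, 1.0

phi = lambda x: np.maximum(x, 0.0)    # ReLU transfer function (an assumption)

# Random recurrent weights, scaled down so the dynamics settle to fixed points
W = 0.9 * rng.standard_normal((N, N)) / np.sqrt(N)

def fixed_point(b, steps=2000):
    """Integrate tau dx/dt = -x + W phi(x) + b by Euler steps until it settles."""
    x = np.zeros(N)
    for _ in range(steps):
        x += (DT / TAU) * (-x + W @ phi(x) + b)
    return phi(x)    # steady-state response of the recurrent network

# Training set: (input, fixed-point response) pairs from the recurrent network
inputs = rng.standard_normal((500, N))
targets = np.stack([fixed_point(b) for b in inputs])

# Two-layer feedforward approximation, trained by plain full-batch gradient descent
W1 = 0.1 * rng.standard_normal((N, H))
W2 = 0.1 * rng.standard_normal((H, N))
lr = 1e-2
for epoch in range(200):
    h = phi(inputs @ W1)               # hidden layer
    pred = h @ W2                      # linear readout
    err = pred - targets
    gW2 = h.T @ err / len(inputs)      # backprop through the readout
    gW1 = inputs.T @ ((err @ W2.T) * (h > 0)) / len(inputs)  # ... and hidden layer
    W2 -= lr * gW2
    W1 -= lr * gW1

print("MSE:", np.mean((phi(inputs @ W1) @ W2 - targets) ** 2))

The point of the exercise is the final line: once trained, the feedforward network returns its answer in a single deterministic pass, whereas the recurrent network requires an ODE integration whose length depends on the input.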


Related articles

Architectural Complexity Measures of Recurrent Neural Networks

In this paper, we systematically analyse the connecting architectures of recurrent neural networks (RNNs). Our main contribution is twofold: first, we present a rigorous graph-theoretic framework describing the connecting architectures of RNNs in general; second, we propose three architecture complexity measures of RNNs: (a) the recurrent depth, which captures the RNN's over-time nonlinear compl...


Coevolution as an Autonomous Learning Strategy for Neuromodules

The training of artificial neural networks to solve a particular problem usually affects only the weight space of the network. In contrast, the network architecture has to be predefined by the user. This pretraining choice of network architecture may demand profound skill and experience on the part of the user; an improperly predefined architecture easily renders a learning problem in...


Efficient Evolution of Asymmetric Recurrent Neural Networks Using a Two-dimensional Representation

Recurrent neural networks are particularly useful for processing time sequences and simulating dynamical systems. However, methods for building recurrent architectures have been hindered by the fact that available training algorithms are considerably more complex than those for feedforward networks. In this paper, we present a new method to build recurrent neural networks based on evolutionary ...


Behavior Emergence in Autonomous Robot Control by Means of Feedforward and Recurrent Neural Networks

We study the emergence of intelligent behavior within a simple intelligent agent. Cognitive agent functions are realized by mechanisms based on neural networks and evolutionary algorithms. The evolutionary algorithm is responsible for adapting the neural network parameters based on the performance of the embodied agent endowed with different neural network architectures. In experiments, we...


Vapnik-Chervonenkis Dimension of Recurrent Neural Networks

Most of the work on the Vapnik-Chervonenkis dimension of neural networks has been focused on feedforward networks. However, recurrent networks are also widely used in learning applications, in particular when time is a relevant parameter. This paper provides lower and upper bounds for the VC dimension of such networks. Several types of activation functions are discussed, including threshold, po...



Journal title:
  • Neural computation

Volume 30, Issue 2

Pages: -

Publication date: 2018